Lowe Down

Instrumental To Success

By: Derek Lowe

Contributing Editor

I’m a chemist, so I tend to see the drug discovery world through a chemocentric filter. In my first few years, it was easy for me to believe that medicinal chemists were the key to the whole enterprise. (One good thing about this industry is that there are so many parts to it that everyone can convince themselves that their function is the most important.) But over time, I found that listening to chemists and biologists argue about which function is more vital was like listening to tires arguing with an engine: you need both of them, and claims of priority are silly.

But there’s another area that I think is sneaking up on all of us, one that’s already important but ready to become much more so: instrumentation. I say that because several trends are pointing in the same direction. Data acquisition and analysis are driven by computational power, so they benefit from the broad rise in capability across that whole sector. The push for nanotechnology is starting to have a real effect, too, since most of the things we’re interested in happen on that scale. Taken together, these have led to some older techniques — like mass spectrometry — showing alarming abilities to rejuvenate themselves, and to some newer ones that promise to do things we’ve never been able to do at all.

I can remember being impressed when I first heard of an HPLC being hooked up to a mass-spec readout, and the combination has ruled the world ever since. But every few months, something new gets added to what you’d think would be a well-worked-out technology. New ionization methods keep getting discovered, which are taking us toward a world where mass-spec probes can deal with larger and larger molecules under steadily less artificial conditions. We’re getting close to being able to get complete mass-spec readouts directly from samples like drops of blood, with little or no sample preparation. And on another front, using the technique as a platform for imaging has long been in the if-you-really-want-to realm, but it’s becoming more routinely feasible all the time. That’s already leading to some interesting applications, but there are a lot more to come. For example, I’m waiting for the first report of a microscale mass-spec image of the surface of a single cell. No one can do that yet — at least, I don’t think anyone can — but it’s clearly just a matter of time.

Outside of mass spec, imaging in general has been coming on strong for some years now, and it’s not getting any less useful. Fluorescent labels are getting smaller and more versatile, and the microscopy techniques used to see them are starting to show temporal and spatial resolutions that were once thought to be basically impossible. The same goes for NMR and PET imaging — in fact, NMR is another good example of a technology that refuses to age. Its sensitivity and resolution, especially for biological applications, are moving right along. I keep waiting for it to get to the point where it can make a real impact on traditional PK measurements, and that probably isn’t too far off.

Combining these various techniques will lead to some very odd-looking instruments showing up in a few years, no doubt. Somewhere, someone is soldering one together right now. No matter what the contraption, all of these methods seem to be following a progression that goes something like this: prepared artificial samples -> purified biological samples -> prepared tissues -> live tissues -> live whole animals -> single living cells.

You can draw a graph of the amount of data that they generate as time goes on, too, and it’ll curl your hair. That’s the troublesome side of all this wonderful stuff: for some time to come, it’s going to generate more data than we know what to do with. If there’s an iron law of scientific instrumentation, it’s that the closer you look, the more you see, and we’re set to see more than we ever have. The prospect fills me with simultaneous excitement and dread, which is a mixture that we seem to get more than our share of.

This should be a familiar story to people who work with genetic variability and expression, to pick a field that’s been through the process recently. Advances in sequencing and assay techniques have generated an overwhelming flood of data over the last few years, while our ability to understand and use these results hasn’t kept pace. OK, maybe “understand and use” is a little generous, as is the phrase “hasn’t kept pace.” It might be more accurate to say that our ability to dog-paddle on top of the torrents of sequence and gene-chip data just managed to keep everyone from being completely swept away. And that’s just with the valid studies, mind you. A lot of the early data in the field was generated under idiosyncratic or poorly controlled conditions, and has to be handled rather gingerly — or, more commonly, ignored altogether in favor of re-running the experiments from scratch with more recent equipment.

We should be braced for similar experiences with these new techniques. As soon as someone cobbles together a working real-time single-cell mass spectrometer/fluorescent imager/NMR hybrid or what have you, they’re going to start cranking out results with it, which will be splattered all over Science, Nature, and the other high-end journals. The instrument designs will take a while to settle down, so there will be some intrinsic variability as the technique gets picked up by other labs, and there will surely be factors affecting the data that no one will even realize until the machines are running all over the place. In the meantime, terabytes of data will come hosing out of the things. Some of it will be very interesting and important, some of it will be meaningless, and some of it will be outright misleading garbage, and it could be quite a long, expensive, hard-working interval before we know which parts are which.

New biology will come out of all this, of course, which in the long run has to be good news. Looking at our failure rates in the clinic, it’s clear that we need all the help we can get. Given what we know versus what we need to know, though, the long run may turn out to be longer than we (or our investors) realize. As some understanding dawns, though, the other thing that new techniques will do, inevitably, is raise the bar for drug discovery. With any luck, the move toward greater sensitivity and smaller samples will allow some of the new gadgets to be used productively in screening mode — the earlier up the discovery pipeline, the better, as long as we’re sure about what we’re seeing. The nightmare would be one or more new techniques that address something very important — human toxicology, for example — but do it in a difficult, tricky, expensive, and not-so-reliable manner. We have quite enough of that sort of thing already; spending huge piles of time and money for more does not appeal.

Derek B. Lowe has been employed since 1989 in pharmaceutical drug discovery in several therapeutic areas. His blog, In the Pipeline, is an invaluable aid to Contract Pharma.
